Likelihood Based Statistical Inference in Hidden Markov Models
Authors
Abstract
The hidden Markov model (HMM) is a type of stochastic signal model widely used in modeling and classification problems, yet more advanced statistical inference for this model has been omitted in almost all applications. In this paper we show how to calculate, in practice, likelihood based confidence intervals for the model parameters and for the probability that a new case is classified to a given class, and how these intervals provide useful insights into the HMM. First, the confidence intervals for the values of the model parameters indicate whether the sample data are sufficient for the modeling problem. In addition, the confidence intervals for the probabilities of a new case quantify the uncertainty of a classification based on the probability alone. We show in detail how to compute two different confidence intervals, namely the Wald and the profile likelihood intervals. We also demonstrate and compare the results of the two approaches in a real example of classifying nasal flow shapes. We found this kind of statistical inference for HMMs to be very useful and informative, and we recommend that it be used regularly in applications of HMMs.
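For concreteness, the sketch below illustrates the two interval types on a toy problem: a two-state Gaussian HMM in which only one emission mean is treated as unknown, while the transition matrix, initial distribution, second mean, and variance are held fixed. This is a minimal, hypothetical illustration under those simplifying assumptions, not the computation used in the paper; the data are simulated and all names are illustrative.

```python
# Minimal sketch (not the paper's code): Wald and profile-likelihood 95% CIs
# for a single free emission mean in a two-state Gaussian HMM. The transition
# matrix, initial distribution, second mean, and variance are assumed known.
import numpy as np
from scipy.optimize import minimize_scalar, brentq
from scipy.stats import norm, chi2

rng = np.random.default_rng(0)

# Assumed (hypothetical) model used to simulate one observation sequence.
A = np.array([[0.9, 0.1], [0.2, 0.8]])   # transition matrix
pi0 = np.array([0.5, 0.5])               # initial state distribution
true_means, sigma = np.array([0.0, 3.0]), 1.0

T = 200
states = np.empty(T, dtype=int)
states[0] = rng.choice(2, p=pi0)
for t in range(1, T):
    states[t] = rng.choice(2, p=A[states[t - 1]])
y = rng.normal(true_means[states], sigma)

def log_likelihood(mu1):
    """Scaled forward algorithm; mu1 is the free emission mean of state 0."""
    means = np.array([mu1, true_means[1]])
    b = norm.pdf(y[:, None], loc=means, scale=sigma)   # T x 2 emission densities
    alpha = pi0 * b[0]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, T):
        alpha = (alpha @ A) * b[t]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

# Maximum likelihood estimate of the free parameter.
res = minimize_scalar(lambda m: -log_likelihood(m), bounds=(-5, 5), method="bounded")
mu_hat, ll_hat = res.x, -res.fun

# Wald interval: mu_hat +/- 1.96 / sqrt(observed information), with the
# observed information estimated by a central finite difference.
h = 1e-4
obs_info = -(log_likelihood(mu_hat + h) - 2 * ll_hat + log_likelihood(mu_hat - h)) / h**2
wald_se = 1.0 / np.sqrt(obs_info)

# Profile-likelihood interval: parameter values where the log-likelihood
# drops by chi2(0.95, 1)/2 (about 1.92) from its maximum.
drop = chi2.ppf(0.95, df=1) / 2.0
g = lambda m: log_likelihood(m) - (ll_hat - drop)
profile = (brentq(g, -5, mu_hat), brentq(g, mu_hat, 5))

print(f"MLE of mu1: {mu_hat:.3f}")
print(f"Wald 95% CI:    ({mu_hat - 1.96 * wald_se:.3f}, {mu_hat + 1.96 * wald_se:.3f})")
print(f"Profile 95% CI: ({profile[0]:.3f}, {profile[1]:.3f})")
```

With a single free parameter the Wald interval follows from the curvature of the log-likelihood at the MLE, whereas the profile interval collects the parameter values whose log-likelihood lies within the chi-squared cut-off of the maximum; in a full HMM the same cut-off criterion is applied while re-maximizing the likelihood over the remaining parameters.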
Similar resources
Computation of Restricted Maximum-penalized-likelihood Estimates in Hidden Markov Models
The maximum-penalized-likelihood estimation for hidden Markov models with general observation densities is described. All statistical inference, including the model estimation, testing, and selection, is based on the restricted optimization of the penalized likelihood function with respect to the chosen model family. The method is used in an economic application, where stock market index return...
Probability, Statistics, and Computational Science
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur ...
Bayesian Methods in Biological Sequence Analysis
Hidden Markov models, the expectation–maximization algorithm, and the Gibbs sampler were introduced for biological sequence analysis in the early 1990s. Since then the use of formal statistical models and inference procedures has revolutionized the field of computational biology. This chapter reviews the hidden Markov and related models, as well as their Bayesian inference procedures and algorithms...
Inference in Hidden Markov Models I: Local Asymptotic Normality in the Stationary Case
Following up on Baum and Petrie (1966) we study likelihood based methods in hidden Markov models, where the hiding mechanism can lead to continuous observations and is itself governed by a parametric model. We show that procedures essentially equivalent to maximum likelihood estimates are asymptotically normal as expected and consistent estimates of their variance can be constructed, so that th...
Statistical Inference in Autoregressive Models with Non-negative Residuals
Normal residuals are one of the usual assumptions of autoregressive models, but in practice we are sometimes faced with the case of non-negative residuals. In this paper we consider some autoregressive models with non-negative residuals as competing models, and we derive the maximum likelihood estimators of the parameters based on the modified approach and the EM algorithm for the competing models. Also,...
Journal:
Volume / Issue:
Pages: -
Publication date: 1999